Provable Low Rank Phase Retrieval
Authors
Abstract
Similar Articles
Provable Low-Rank Tensor Recovery
In this paper, we rigorously study tractable models for provably recovering low-rank tensors. Unlike their matrix-based predecessors, current convex approaches for recovering low-rank tensors based on incomplete (tensor completion) and/or grossly corrupted (tensor robust principal analysis) observations still suffer from the lack of theoretical guarantees, although they have been used in variou...
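The convex approaches mentioned in this abstract typically relax tensor rank via nuclear norms of the mode unfoldings ("sum of nuclear norms"). A minimal numpy sketch of that surrogate, with illustrative function name and shapes:

```python
import numpy as np

def unfolding_nuclear_norms(T):
    """Nuclear norm of each mode-n unfolding of a 3-way tensor.

    Summing these gives the 'sum of nuclear norms' convex surrogate
    commonly used in low-rank tensor completion formulations.
    """
    norms = []
    for mode in range(3):
        # Move the chosen mode to the front and flatten the rest
        mat = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        norms.append(np.linalg.norm(mat, ord='nuc'))
    return norms

# A rank-1 tensor: every mode unfolding has rank 1
a, b, c = np.ones(3), np.ones(4), np.ones(5)
T = np.einsum('i,j,k->ijk', a, b, c)
norms = unfolding_nuclear_norms(T)
```

For a rank-1 tensor a∘b∘c each unfolding is rank one, so each nuclear norm equals ‖a‖·‖b‖·‖c‖.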
Provable Models for Robust Low-Rank Tensor Completion
In this paper, we rigorously study tractable models for provably recovering low-rank tensors. Unlike their matrix-based predecessors, current convex approaches for recovering low-rank tensors based on incomplete (tensor completion) and/or grossly corrupted (tensor robust principal analysis) observations still suffer from the lack of theoretical guarantees, although they have been used in variou...
Provable Non-convex Phase Retrieval with Outliers: Median Truncated Wirtinger Flow
Solving systems of quadratic equations is a central problem in machine learning and signal processing. One important example is phase retrieval, which aims to recover a signal from only magnitudes of its linear measurements. This paper focuses on the situation when the measurements are corrupted by arbitrary outliers, for which the recently developed non-convex gradient descent Wirtinger flow (...
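Plain (untruncated) Wirtinger flow on noiseless, real-valued intensity measurements can be sketched in a few lines of numpy. The median-truncation step that gives outlier robustness is omitted, and the step size and iteration count are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 200                        # signal dimension, number of measurements

x_true = rng.standard_normal(n)       # unknown signal
A = rng.standard_normal((m, n))       # Gaussian measurement vectors
y = (A @ x_true) ** 2                 # intensity-only (phaseless) observations

# Spectral initialization: leading eigenvector of (1/m) * sum_i y_i a_i a_i^T
Y = (A.T * y) @ A / m
_, vecs = np.linalg.eigh(Y)
z = vecs[:, -1] * np.sqrt(y.mean())   # scale by estimated signal norm

# Gradient descent on the intensity loss f(z) = (1/4m) * sum ((a_i^T z)^2 - y_i)^2
step = 0.1 / y.mean()
for _ in range(500):
    Az = A @ z
    grad = A.T @ ((Az ** 2 - y) * Az) / m
    z = z - step * grad

# The signal is only identifiable up to a global sign
err = min(np.linalg.norm(z - x_true), np.linalg.norm(z + x_true))
```

With m/n = 10 Gaussian measurements, the spectral initializer lands close enough to the truth that gradient descent converges despite the non-convex loss.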
Retrieval Performance Improvement through Low Rank Corrections
Whenever a feature extracted from an image has a unimodal distribution, information about its covariance matrix can be exploited for content-based retrieval using as dissimilarity measure the Bhattacharyya distance. To reduce the amount of computations and the size of logical database entry, we approximate the Bhattacharyya distance taking into account that most of the energy in the feature spa...
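The exact Bhattacharyya distance between two Gaussian feature models, which the paper proposes to approximate, follows a standard closed form. A sketch of that baseline (not the paper's low-rank correction):

```python
import numpy as np

def bhattacharyya(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two multivariate Gaussians."""
    cov = 0.5 * (cov1 + cov2)
    diff = mu2 - mu1
    # Mahalanobis-like term for the mean difference
    term_mean = 0.125 * diff @ np.linalg.solve(cov, diff)
    # Log-determinant term comparing the average covariance to each one
    _, ld = np.linalg.slogdet(cov)
    _, ld1 = np.linalg.slogdet(cov1)
    _, ld2 = np.linalg.slogdet(cov2)
    term_cov = 0.5 * (ld - 0.5 * (ld1 + ld2))
    return term_mean + term_cov

# Identical distributions have distance zero
d0 = bhattacharyya(np.zeros(3), np.eye(3), np.zeros(3), np.eye(3))
```

Using `slogdet` instead of `det` keeps the covariance term numerically stable in high dimensions, where determinants underflow.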
Towards Provable Learning of Polynomial Neural Networks Using Low-Rank Matrix Estimation
We study the problem of (provably) learning the weights of a two-layer neural network with quadratic activations. In particular, we focus on the under-parametrized regime where the number of neurons in the hidden layer is (much) smaller than the dimension of the input. Our approach uses a lifting trick, which enables us to borrow algorithmic ideas from low-rank matrix estimation. In this contex...
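The lifting trick this abstract alludes to rewrites a quadratic-activation network's output as a linear function of a rank-k positive semidefinite matrix, which is what lets low-rank matrix estimation machinery apply. A small numpy illustration (dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, m = 10, 2, 200            # input dim, hidden width (k << d), samples

W = rng.standard_normal((d, k))  # hidden-layer weights, one column per neuron
X = rng.standard_normal((m, d))

# Two-layer network with quadratic activations: y = sum_j (w_j^T x)^2
y_net = ((X @ W) ** 2).sum(axis=1)

# Lifting: the same outputs are linear in M = W W^T, a rank-k PSD matrix,
# since sum_j (w_j^T x)^2 = x^T (W W^T) x
M = W @ W.T
y_lift = np.einsum('ij,jk,ik->i', X, M, X)
```

Under-parametrization (k much smaller than d) is exactly what makes M low-rank, so recovering the network reduces to estimating a rank-k matrix from linear-in-M measurements.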
Journal
Journal title: IEEE Transactions on Information Theory
Year: 2020
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/tit.2020.2984478